Planning with Learned Entity Prompts for Abstractive Summarization

Authors

Abstract

We introduce a simple but flexible mechanism to learn an intermediate plan to ground the generation of abstractive summaries. Specifically, we prepend (or prompt) target summaries with entity chains (ordered sequences of the entities mentioned in the summary). Transformer-based sequence-to-sequence models are then trained to generate the entity chain and to continue generating the summary conditioned on the entity chain and the input. We experimented with both pretraining and finetuning with this content planning objective. When evaluated on CNN/DailyMail, XSum, SAMSum, and BillSum, we demonstrate empirically that this grounded generation objective improves entity specificity in summaries for all datasets and achieves state-of-the-art performance on XSum and SAMSum in terms of ROUGE. Moreover, planning with entity chains provides a mechanism to control hallucinations in summaries. By prompting the decoder with a modified content plan that drops hallucinated entities, we outperform state-of-the-art approaches for faithfulness when evaluated automatically and by humans.
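As a rough sketch of the mechanism described in the abstract (not the authors' released code), the snippet below shows how a training target might be built by prepending an ordered entity chain to the reference summary, and how a pruned chain could be used as a decoder prompt to drop unsupported entities at inference time. The [CONTENT]/[SUMMARY] markers, the "|" separator, the regex-based entity extractor, and all function names are illustrative assumptions; the paper's pipeline would rely on a proper entity tagger.

```python
# Illustrative sketch of entity-chain prompting for abstractive summarization.
# Marker tokens, separator, and helpers are assumptions, not the paper's exact format.
import re

CONTENT, SUMMARY, SEP = "[CONTENT]", "[SUMMARY]", " | "

def extract_entity_chain(summary: str) -> list[str]:
    """Stand-in for a real NER step: collect capitalized spans in order of
    first mention. A proper pipeline would use an entity tagger instead."""
    seen, chain = set(), []
    for match in re.finditer(r"\b[A-Z][a-zA-Z]+(?:\s+[A-Z][a-zA-Z]+)*", summary):
        ent = match.group(0)
        if ent not in seen:
            seen.add(ent)
            chain.append(ent)
    return chain

def build_target(summary: str) -> str:
    """Prepend (prompt) the reference summary with its ordered entity chain,
    so a seq2seq model learns to emit the plan before the summary."""
    chain = extract_entity_chain(summary)
    return f"{CONTENT} {SEP.join(chain)} {SUMMARY} {summary}"

def prompt_with_filtered_chain(chain: list[str], source: str) -> str:
    """At inference, drop entities not supported by the source document and
    feed the modified plan as a decoder prompt to curb hallucination."""
    faithful = [e for e in chain if e.lower() in source.lower()]
    return f"{CONTENT} {SEP.join(faithful)} {SUMMARY}"

if __name__ == "__main__":
    ref = "Boris Johnson met Emmanuel Macron in Paris on Friday."
    print(build_target(ref))
    # [CONTENT] Boris Johnson | Emmanuel Macron | Paris | Friday [SUMMARY] Boris Johnson met ...
```

Under these assumptions, the prefixed targets would simply replace the plain reference summaries when fine-tuning a standard encoder-decoder model; at test time the model first emits its entity plan and then generates the summary conditioned on it.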


Similar papers

Text Generation for Abstractive Summarization

We have begun work on a framework for abstractive summarization and decided to focus on a module for text generation. For TAC 2010, we thus move away from sentence extraction. Each sentence in the summary we generate is based on a document sentence but it usually contains a smaller amount of information and uses fewer words. The system uses the output of a syntactic parser for a sentence and th...


Abstractive Summarization for Amazon Reviews

This paper focuses on a feed-forward neural network with an attention-based encoder to address the challenge of abstractive summarization. We also briefly explored the potential of an attentive recurrent neural network and a recurrent neural network encoder-decoder. Those models were originally proposed to solve similar tasks, such as news article summarization and machine translation; we modify and exten...


Controllable Abstractive Summarization

Current models for document summarization ignore user preferences such as the desired length, style or entities that the user has a preference for. We present a neural summarization model that enables users to specify such high level attributes in order to control the shape of the final summaries to better suit their needs. With user input, we show that our system can produce high quality summa...


Neural Abstractive Text Summarization

Abstractive text summarization is a complex task whose goal is to generate a concise version of a text without necessarily reusing the sentences from the original source, but still preserving the meaning and the key contents. We address this issue by modeling the problem as a sequence to sequence learning and exploiting Recurrent Neural Networks (RNNs). This work is a discussion about our ongoi...


Abstractive Meeting Summarization with Entailment and Fusion

Yashar Mehdad and Giuseppe Carenini (University of British Columbia), Frank W. Tompa (University of Waterloo).



Journal

Journal title: Transactions of the Association for Computational Linguistics

Year: 2021

ISSN: 2307-387X

DOI: https://doi.org/10.1162/tacl_a_00438